
    Spatial vision in insects is facilitated by shaping the dynamics of visual input through behavioral action

    Egelhaaf M, Boeddeker N, Kern R, Kurtz R, Lindemann JP. Spatial vision in insects is facilitated by shaping the dynamics of visual input through behavioral action. Frontiers in Neural Circuits. 2012;6:108.
    Insects such as flies or bees, with their miniature brains, are able to control highly aerobatic flight manoeuvres and to solve spatial vision tasks, such as avoiding collisions with obstacles, landing on objects, or even localizing a previously learnt inconspicuous goal on the basis of environmental cues. With regard to solving such spatial tasks, these insects still outperform man-made autonomous flying systems. To accomplish this extraordinary performance, flies and bees have been shown to actively shape the dynamics of the image flow on their eyes ("optic flow") through characteristic behavioral actions. The neural processing of information about the spatial layout of the environment is greatly facilitated by segregating the rotational from the translational optic flow component through a saccadic flight and gaze strategy. This active vision strategy thus enables the nervous system to solve apparently complex spatial vision tasks in a particularly efficient and parsimonious way. The key idea of this review is that biological agents, such as flies or bees, acquire at least part of their strength as autonomous systems through active interactions with their environment, and not by simply processing passively gained information about the world. These agent-environment interactions lead to adaptive behavior in surroundings of a wide range of complexity. Animals with even tiny brains, such as insects, are capable of performing extraordinarily well in their behavioral contexts by making optimal use of the closed action-perception loop. Model simulations and robotic implementations show that these smart biological mechanisms of motion computation and visually guided flight control might help to find technical solutions, for example when designing micro air vehicles that carry a miniaturized, low-weight on-board processor.
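The segregation of rotational from translational optic flow described in this abstract can be made concrete with a small sketch (the function and parameter names are illustrative assumptions, not code from the reviewed work): the rotational flow component is independent of scene depth, while the translational component scales inversely with distance, which is why isolating the translational component during straight flight segments yields information about the 3D layout.

```python
import numpy as np

def optic_flow(d, omega, v, depth):
    """Image motion induced at viewing direction d (unit vector).

    omega : angular velocity of the eye (rad/s)
    v     : translational velocity of the eye (m/s)
    depth : distance to the viewed point along d (m)
    """
    rotational = -np.cross(omega, d)                  # independent of depth
    translational = -(v - np.dot(v, d) * d) / depth   # scales with 1/depth
    return rotational + translational
```

During a saccade, omega dominates and the flow carries essentially no depth signal; between saccades omega is near zero, so the remaining flow is translational and encodes nearness (1/depth).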

    The Behavioral Relevance of Landmark Texture for Honeybee Homing

    Honeybees visually pinpoint the location of a food source using landmarks. Studies on the role of visual memories have suggested that bees approach the goal by finding a close match between their current view and a memorized view of the goal location. The most relevant landmark features for this matching process seem to be their retinal positions, their size as defined by their edges, and their color. Recently, we showed that honeybees can use landmarks that are statically camouflaged, suggesting that motion cues are relevant as well. It is currently unclear how bees weight these different landmark features when accomplishing navigational tasks, and whether this depends on their saliency. Since natural objects are often distinguished by their texture, we investigated the behavioral relevance and the interplay of the spatial configuration and the texture of landmarks. We show that landmark texture is a feature that bees memorize, and that the opportunity to identify landmarks by their texture improves the bees’ navigational performance. Landmark texture is weighted more strongly than landmark configuration when it provides the bees with positional information and when the texture is salient. In the vicinity of the landmark, honeybees changed their flight behavior according to its texture.

    Prototypical Components of Honeybee Homing Flight Behavior Depend on the Visual Appearance of Objects Surrounding the Goal

    Honeybees use visual cues to relocate profitable food sources and their hive. What bees see while navigating depends on the appearance of the cues and on the bee’s current position, orientation, and movement relative to them. Here we analyze the detailed flight behavior during the localization of a goal surrounded by cylinders that are characterized either by a high contrast in luminance and texture or mostly by motion contrast relative to the background. By relating flight behavior to the nature of the information available from these landmarks, we aim to identify behavioral strategies that facilitate the processing of visual information during goal localization. We decompose flight behavior into prototypical movements using clustering algorithms in order to reduce the behavioral complexity. The determined prototypical movements reflect the honeybee’s saccadic flight pattern, which largely separates rotational from translational movements. During phases of translational movement between fast saccadic rotations, the bees can gain information about the 3D layout of their environment from the translational optic flow. The prototypical movements reveal the prominent role of sideways and up- or downward movements, which can help bees to gather information about objects, particularly in the frontal visual field. We find that the occurrence of specific prototypes depends on the bees’ distance from the landmarks and the feeder, and that changing the texture of the landmarks evokes different prototypical movements. The adaptive use of different behavioral prototypes shapes the visual input and can facilitate information processing in the bees’ visual system during local navigation.
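The decomposition into prototypical movements rests on clustering. As a rough illustration (the feature set and the plain k-means variant here are assumptions, not the paper's actual pipeline), one can cluster per-segment velocity vectors and read each cluster center as a movement prototype:

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Minimal k-means: returns (centers, labels) for data X of shape (n, d)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(iters):
        # assign each sample to its nearest center
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        # move each center to the mean of its assigned samples
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

# toy flight segments: [forward, sideways, vertical] velocities (hypothetical units)
segments = np.array([[0.5, 0.0, 0.0]] * 20 +   # forward translation
                    [[0.0, 0.4, 0.0]] * 20 +   # sideways translation
                    [[0.0, 0.0, 0.3]] * 20)    # upward movement
prototypes, labels = kmeans(segments, k=3)
```

Each row of `prototypes` then corresponds to one prototypical movement, and the frequency of each label can be related, for example, to the bee's distance from the landmarks.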

    Visual motion-sensitive neurons in the bumblebee brain convey information about landmarks during a navigational task

    Mertes M, Dittmar L, Egelhaaf M, Boeddeker N. Visual motion-sensitive neurons in the bumblebee brain convey information about landmarks during a navigational task. Frontiers in Behavioral Neuroscience. 2014;8:335.
    Bees use visual memories to find the spatial location of previously learnt food sites. Characteristic learning flights help to acquire these memories at newly discovered foraging locations, where landmarks (salient objects in the vicinity of the goal location) can play an important role in guiding the animal's homing behavior. Although behavioral experiments have shown that bees can use a variety of visual cues to distinguish objects as landmarks, the question of how landmark features are encoded by the visual system is still open. Recently, it was shown that motion cues are sufficient to allow bees to localize their goal using landmarks that can hardly be discriminated from the background texture. Here, we tested the hypothesis that motion-sensitive neurons in the bee's visual pathway provide information about such landmarks during a learning flight and might thus play a role in goal localization. We tracked learning flights of free-flying bumblebees (Bombus terrestris) in an arena with distinct visual landmarks, reconstructed the visual input during these flights, and replayed ego-perspective movies to tethered bumblebees while recording the activity of direction-selective wide-field neurons in their optic lobe. By comparing neuronal responses during a typical learning flight with responses to targeted modifications of landmark properties in this movie, we demonstrate that these objects are indeed represented in the bee's visual motion pathway. We find that object-induced responses vary little with object texture, which is in agreement with behavioral evidence. These neurons thus convey information about landmark properties that are useful for view-based homing.

    Blowfly flight characteristics are shaped by environmental features and controlled by optic flow information

    Kern R, Boeddeker N, Dittmar L, Egelhaaf M. Blowfly flight characteristics are shaped by environmental features and controlled by optic flow information. Journal of Experimental Biology. 2012;215(14):2501-2514.
    Blowfly flight consists of two main components, saccadic turns and intervals of mostly straight gaze direction, although, as a consequence of inertia, flight trajectories usually change direction smoothly. We investigated how flight behavior changes depending on the surroundings, and how saccadic turns and intersaccadic translational movements might be controlled, in arenas of different width with and without obstacles. Blowflies do not fly in straight trajectories, even when traversing straight flight arenas; rather, they fly in meandering trajectories. Flight speed and the amplitude of meanders increase with arena width. Although saccade duration is largely constant, peak angular velocity and the succession of turn directions are variable and depend on the visual surroundings. Saccade rate and amplitude also vary with arena layout and are correlated with the 'time-to-contact' to the arena wall. We provide evidence that both saccade and velocity control rely to a large extent on the intersaccadic optic flow generated in eye regions looking well in front of the fly, rather than in the lateral visual field, where the optic flow, at least during forward flight, tends to be strongest.

    Bumblebee Homing: The Fine Structure of Head Turning Movements

    Boeddeker N, Mertes M, Dittmar L, Egelhaaf M. Bumblebee Homing: The Fine Structure of Head Turning Movements. PLoS ONE. 2015;10(9):e0135020.
    Changes in flight direction in flying insects are largely due to roll, yaw and pitch rotations of their body. Head orientation is stabilized most of the time by counter-rotation. Here, we use high-speed video to analyse head and body movements of the bumblebee Bombus terrestris while approaching and departing from a food source located between three landmarks in an indoor flight arena. The flight paths consist of almost straight flight segments that are interspersed with rapid turns. These short and fast yaw turns (“saccades”) are usually accompanied by even faster head yaw turns that change gaze direction. Since a large part of image rotation is thereby reduced to brief instants of time, this behavioural pattern facilitates depth perception from visual motion parallax during the intersaccadic intervals. A detailed analysis of the fine structure of the bees’ head turning movements shows that the time course of single head saccades is very stereotypical. We find a consistent relationship between the duration, peak velocity and amplitude of saccadic head movements, which in its main characteristics resembles the so-called "saccadic main sequence" in humans. The fact that bumblebee head saccades are highly stereotyped, as in humans, may hint at a common principle in which fast and precise motor control is used to reliably reduce the time during which the retinal image moves.
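The "saccadic main sequence" refers to the stereotyped relationship between saccade amplitude, duration, and peak velocity. In the human oculomotor literature it is commonly summarized with a saturating exponential; the sketch below uses that standard form with illustrative parameter values, not values fitted to the bumblebee data:

```python
import numpy as np

def peak_velocity(amplitude_deg, v_max=1000.0, c=15.0):
    """Main-sequence model: peak velocity saturates with saccade amplitude.

    V_peak = V_max * (1 - exp(-A / c)); v_max (deg/s) and the amplitude
    constant c (deg) are illustrative, not fitted parameters.
    """
    a = np.asarray(amplitude_deg, dtype=float)
    return v_max * (1.0 - np.exp(-a / c))
```

Small saccades fall on the nearly linear part of this curve, while large saccades approach the velocity ceiling v_max, which is the qualitative signature the authors compare against human data.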

    Finding home. Landmark ambiguity in human navigation

    Jetzschke S, Fröhlich J, Ernst MO, Boeddeker N. Finding home. Landmark ambiguity in human navigation. Frontiers in Behavioural Neuroscience. 2017;11:132.
    Memories of places often include landmark cues, i.e., information provided by the spatial arrangement of distinct objects with respect to the target location. To study how humans combine landmark information for navigation, we conducted two experiments: participants were either provided with auditory landmarks while walking in a large sports hall or with visual landmarks while walking on a virtual-reality treadmill setup. We found that participants cannot reliably locate their home position when only one or two uniform landmarks provide cues with respect to the target, owing to ambiguities in the spatial arrangement. With three visual landmarks that look alike, the task is solved without ambiguity, while audio landmarks need to play three unique sounds for a similar performance. This reduction in ambiguity through the integration of information from 1, 2, and 3 landmarks is well modeled with a probabilistic approach based on maximum likelihood estimation. Unlike any deterministic model of human navigation (based, e.g., on distance or angle information), this probabilistic model predicted both the precision and the accuracy of the human homing performance. To further examine how landmark cues are integrated, we introduced systematic conflicts in the visual landmark configuration between training of the home position and tests of the homing performance. The participants integrated the spatial information from each landmark near-optimally to reduce spatial variability. When the conflict becomes large, this integration breaks down and precision is sacrificed for accuracy: participants return closer to the home position because they start ignoring the deviant third landmark. Relying on two instead of three landmarks, however, produces responses that are scattered over a larger area, leading to higher variability. To model the breakdown of integration with increasing conflict, the probabilistic model based on a single Gaussian distribution used for Experiment 1 needed a slight extension in the form of a mixture of Gaussians. All parameters of the mixture model were fixed based on the homing performance in the baseline condition, which contained a single landmark (the 1-Landmark Condition). In this way, we found that the mixture model could predict the integration performance and its breakdown with no additional free parameters. Overall, these data suggest that humans use similar optimal probabilistic strategies in visual and auditory navigation, integrating landmark information to improve homing precision while balancing homing precision with homing accuracy.
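The maximum-likelihood account of landmark integration can be sketched as follows (a generic Gaussian cue-combination sketch, not the authors' actual model code): each landmark provides a noisy position estimate, the optimal combined estimate weights each cue by its inverse variance, and the combined variance is smaller than that of any single cue.

```python
import numpy as np

def mle_combine(means, sigmas):
    """ML fusion of independent Gaussian cues -> (combined mean, combined sigma)."""
    means = np.asarray(means, dtype=float)
    w = 1.0 / np.asarray(sigmas, dtype=float) ** 2   # reliability = inverse variance
    mu = np.sum(w * means) / np.sum(w)               # reliability-weighted mean
    sigma = np.sqrt(1.0 / np.sum(w))                 # fused uncertainty
    return mu, sigma
```

With three equally reliable landmarks, the predicted homing scatter shrinks by a factor of sqrt(3) relative to one landmark, which is the kind of precision gain the model is tested against; capturing the breakdown under large conflicts then requires replacing the single Gaussian with a mixture, as the abstract describes.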

    Out of the box: how bees orient in an ambiguous environment

    Dittmar L, Stürzl W, Jetzschke S, Mertes M, Boeddeker N. Out of the box: how bees orient in an ambiguous environment. Animal Behaviour. 2014;89:13-21.
    How do bees employ multiple visual cues for homing? They could either combine the available cues using a view-based computational mechanism or pick one cue. We tested these strategies by training honeybees, Apis mellifera carnica, and bumblebees, Bombus terrestris, to locate food in one of the four corners of a box-shaped flight arena that provided multiple, and also ambiguous, cues. In tests, bees confused the diagonally opposite corners, which looked the same from the inside of the box owing to its rectangular shape and because these corners carried the same local colour cues. These 'rotational errors' indicate that the bees did not use compass information inferred from the geomagnetic field under our experimental conditions. When we then swapped cues between corners, bees preferred corners that had local cues similar to the trained corner, even when the geometric relations were incorrect. Apparently, they relied on views, a finding that we corroborated by computer simulations in which we assumed that bees try to match a memorized view of the goal location with the current view when they return to the box. However, when extra visual cues outside the box were provided, bees were able to resolve the ambiguity and locate the correct corner. We show that this performance cannot be explained by view matching from inside the box. Indeed, the bees adapted their behaviour and actively acquired information by leaving the arena and flying towards the cues outside the box. From there they re-entered the arena at the correct corner, now ignoring local cues that previously dominated their choices. All individuals of both species came up with this new behavioural strategy for solving the problem posed by the local ambiguity within the box.
Thus both species seemed to be solving the ambiguous task by using their route memory, which is always available during their natural foraging behaviour.

    Visual gaze control during peering flight manoeuvres in honeybees

    As animals travel through the environment, powerful reflexes help stabilize their gaze by actively maintaining head and eyes in a level orientation. Gaze stabilization reduces motion blur and prevents image rotations. It also assists in depth perception based on translational optic flow. Here we describe side-to-side flight manoeuvres in honeybees and investigate how the bees’ gaze is stabilized against rotations during these movements. We used high-speed video equipment to record flight paths and head movements of honeybees visiting a feeder. We show that during their approach, bees generate lateral movements with a median amplitude of about 20 mm. These movements occur with a frequency of up to 7 Hz and are generated by periodic roll movements of the thorax with amplitudes of up to ±60°. During such thorax roll oscillations, the head is held close to horizontal, thereby minimizing rotational optic flow. By having bees fly through an oscillating, patterned drum, we show that head stabilization is based mainly on visual motion cues. Bees exposed to a continuously rotating drum, however, hold their head fixed at an oblique angle. This result shows that although gaze stabilization is driven by visual motion cues, it is limited by other mechanisms, such as the dorsal light response or gravity reception.